General Defocusing Particle Tracking: fundamentals and uncertainty assessment
General Defocusing Particle Tracking (GDPT) is a single-camera,
three-dimensional particle tracking method that determines the particle depth
positions from the defocusing patterns of the corresponding particle images.
GDPT relies on a reference set of experimental particle images which is used to
predict the depth position of measured particle images of similar shape. While
several implementations of the method are possible, its accuracy is ultimately
limited by intrinsic properties of the acquired data, such as the
signal-to-noise ratio, the particle concentration, and the characteristics of
the defocusing patterns. GDPT has been applied in different fields by different
research groups; however, a deeper description and analysis of the method
fundamentals has hitherto not been available. In this work, we first identify
the fundamental elements that characterize a GDPT measurement.
Afterwards, we present a standardized framework based on synthetic images to
assess the performance of GDPT implementations in terms of measurement
uncertainty and relative number of measured particles. Finally, we provide
guidelines to assess the uncertainty of experimental GDPT measurements, where
true values are not accessible and additional image aberrations can lead to
bias errors. The data were processed using DefocusTracker, an open-source GDPT
software. The datasets were created using the synthetic image generator
MicroSIG and have been shared in a freely accessible repository.
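The look-up principle at the core of GDPT, matching a measured particle image against a reference stack of calibration images, can be sketched in a few lines. This is a minimal nearest-neighbour version with an assumed Gaussian-blob defocusing model; the similarity metric and all numbers are illustrative assumptions, not DefocusTracker's actual implementation.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation between two equal-sized images."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def gdpt_depth(measured, ref_stack, ref_depths):
    """Assign the depth of the most similar reference image
    (nearest-neighbour look-up; real implementations interpolate
    between reference planes for sub-plane resolution)."""
    scores = [ncc(measured, r) for r in ref_stack]
    return ref_depths[int(np.argmax(scores))]

def blob(width, n=21):
    """Toy defocused particle image: a Gaussian whose width grows with depth."""
    y, x = np.mgrid[-(n // 2):n // 2 + 1, -(n // 2):n // 2 + 1]
    return np.exp(-(x**2 + y**2) / (2.0 * width**2))

depths = np.linspace(0.0, 10.0, 11)             # reference depth planes [um]
stack = [blob(1.0 + 0.3 * z) for z in depths]   # reference look-up table

measured = blob(1.0 + 0.3 * 6.0)                # particle at z = 6 um
print(gdpt_depth(measured, stack, depths))      # -> 6.0
```

In a real measurement the look-up must also cope with noise, overlapping particles, and aberrations, which is where the uncertainty assessment described above comes in.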
Diving with microparticles in acoustic fields
Sound can move particles. A good example of this phenomenon is the Chladni
plate, in which an acoustic wave is induced in a metallic plate and particles
migrate to the nodes of the acoustic wave. For several years, acoustophoresis
has been used to manipulate microparticles at microscopic scales. In this fluid
dynamics video, submitted to the 30th Annual Gallery of Fluid Motion, we show
the basic mechanism of the technique and a simple way of visualizing it. Since
acoustophoretic phenomena are essentially three-dimensional, we employ
a simple technique to visualize the particles in 3D. The technique is called
Astigmatism Particle Tracking Velocimetry, and it consists in using
cylindrical lenses to induce a deformation in the particle image, which is
then correlated with its distance from the observer. With this method we are
able to dive with the particles and observe in detail particle motion that
would otherwise be missed. The technique not only permits visualization but
also precise quantitative measurements that can be compared with theory and
simulations.

Comment: Fluid dynamics video for the 30th Annual Gallery of Fluid Motion,
Entry #84160; 65th Annual Meeting of the American Physical Society, Division
of Fluid Dynamics, San Diego, CA, Nov 201
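The depth reconstruction behind Astigmatism Particle Tracking Velocimetry can be illustrated with a toy calibration: the cylindrical lens makes the particle image elliptical, and the measured aspect ratio is mapped back to depth. The linear response and all numbers below are assumptions for illustration only; real calibrations are measured for each optical setup.

```python
import numpy as np

# A cylindrical lens in the optical path makes the particle image
# elliptical, with an aspect ratio that varies monotonically with depth.
# A calibration then maps the measured aspect ratio back to z.
z_cal = np.linspace(-50.0, 50.0, 101)       # calibration depths [um]
aspect_cal = 1.0 + 0.01 * z_cal             # assumed aspect-ratio response

# Fit the calibration curve (linear here; splines/polynomials in practice).
coeffs = np.polyfit(aspect_cal, z_cal, deg=1)

def depth_from_aspect(ar):
    """Invert the calibration: elliptical aspect ratio -> depth [um]."""
    return float(np.polyval(coeffs, ar))

print(round(depth_from_aspect(1.25), 3))    # -> 25.0
```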
Ultrasound-induced acoustophoretic motion of microparticles in three dimensions
We derive analytical expressions for the three-dimensional (3D)
acoustophoretic motion of spherical microparticles in rectangular
microchannels. The motion is generated by the acoustic radiation force and the
acoustic streaming-induced drag force. In contrast to the classical theory of
Rayleigh streaming in shallow, infinite, parallel-plate channels, our theory
does include the effect of the microchannel side walls. The resulting
predictions agree well with numerics and experimental measurements of the
acoustophoretic motion of polystyrene spheres with nominal diameters of 0.537
um and 5.33 um. The 3D particle motion was recorded using astigmatism particle
tracking velocimetry under controlled thermal and acoustic conditions in a
long, straight, rectangular microchannel actuated in one of its transverse
standing ultrasound-wave resonance modes with one or two half-wavelengths. The
acoustic energy density is calibrated in situ based on measurements of the
radiation-dominated motion of large 5-um-diameter particles, allowing for
quantitative comparison between theoretical predictions and measurements of the
streaming-induced motion of small 0.5-um-diameter particles.

Comment: 13 pages, 8 figures, RevTeX 4.
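The in-situ energy-density calibration described above can be sketched from the standard single-particle result for a transverse 1D standing wave, where the peak radiation-induced velocity is u_max = 2*phi*k*a^2*E_ac/(3*eta). All numerical values below are assumed for illustration and are not the paper's data.

```python
import numpy as np

def e_ac_from_velocity(u_max, a, phi, eta, k):
    """Acoustic energy density E_ac [Pa] from the peak radiation-dominated
    particle velocity, inverting the single-particle 1D standing-wave result
    u_max = 2 * phi * k * a**2 * E_ac / (3 * eta)."""
    return 3.0 * eta * u_max / (2.0 * phi * k * a**2)

# Illustrative values (assumed, not the paper's data):
a = 2.5e-6      # radius of a 5-um-diameter particle [m]
phi = 0.17      # acoustic contrast factor, polystyrene in water
eta = 1.0e-3    # dynamic viscosity of water [Pa s]
f = 2.0e6       # actuation frequency [Hz]
c = 1500.0      # speed of sound in water [m/s]
k = 2.0 * np.pi * f / c

print(e_ac_from_velocity(100e-6, a, phi, eta, k))  # ~17 Pa
```

Once E_ac is fixed this way from the large particles, the streaming-induced velocities of the small particles can be predicted with no free parameters.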
Acoustic radiation- and streaming-induced microparticle velocities determined by micro-PIV in an ultrasound symmetry plane
We present micro-PIV measurements of suspended microparticles of diameters
from 0.6 um to 10 um undergoing acoustophoresis in an ultrasound symmetry plane
in a microchannel. The motion of the smallest particles is dominated by the
Stokes drag from the induced acoustic streaming flow, while the motion of the
largest particles is dominated by the acoustic radiation force. For all
particle sizes we predict theoretically how much of the particle velocity is
due to radiation and streaming, respectively. These predictions include
corrections for particle-wall interactions and ultrasonic thermoviscous
effects, and they match our measurements within the experimental uncertainty.
Finally, we predict theoretically and confirm experimentally that the ratio
between the acoustic radiation- and streaming-induced particle velocities is
proportional to the square of the particle size, the actuation frequency and
the acoustic contrast factor, while it is inversely proportional to the
kinematic viscosity.

Comment: 11 pages, 9 figures, RevTex 4-
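The stated scaling, with u_rad/u_str proportional to a^2, f, and phi and inversely proportional to nu, is easy to sanity-check numerically. The geometry-dependent prefactor is omitted and all input values are illustrative.

```python
def velocity_ratio(a, f, phi, nu, prefactor=1.0):
    """Ratio of radiation- to streaming-induced particle velocities,
    up to a geometry-dependent prefactor (omitted here): proportional
    to a**2 * f * phi, inversely proportional to nu."""
    return prefactor * a**2 * f * phi / nu

# Doubling the particle radius quadruples the ratio:
r1 = velocity_ratio(a=1.0e-6, f=2.0e6, phi=0.17, nu=1.0e-6)
r2 = velocity_ratio(a=2.0e-6, f=2.0e6, phi=0.17, nu=1.0e-6)
print(r2 / r1)  # -> 4.0
```

This quadratic size dependence is what separates the radiation-dominated large particles from the streaming-dominated small ones in the experiments above.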
DefocusTracker: A Modular Toolbox for Defocusing-based, Single-Camera, 3D Particle Tracking
The need for single-camera 3D particle tracking methods is growing, among
other reasons because biomedical research increasingly relies on single-plane
microscopy imaging. Defocusing-based methods are ideal for widespread use, as
they rely on basic microscopy imaging rather than requiring additional
non-standard optics. However, widespread use has been limited by
the lack of accessible and easy-to-use software. DefocusTracker is an
open-source toolbox based on the universal principles of General Defocusing
Particle Tracking (GDPT) relying solely on a reference look-up table and image
recognition to connect a particle's image and its respective out-of-plane depth
coordinate. The toolbox is built in a modular fashion, allowing for easy
addition of new image recognition methods, while maintaining the same workflow
and external user interface. DefocusTracker is implemented in MATLAB, while a
parallel implementation in Python is in preparation.

Comment: 11 pages, 3 figures, submitted to the Journal of Open Research
Software (JORS).